
Uncertainty-penalized Bayesian information criterion


Adaptation of uncertainty-penalized Bayesian information criterion for parametric partial differential equation discovery

Thanasutives, Pongpisit, Fukui, Ken-ichi

arXiv.org Artificial Intelligence

Data-driven discovery of partial differential equations (PDEs) has emerged as a promising approach for deriving governing physics when domain knowledge about observed data is limited. Despite recent progress, identifying governing equations and their parametric dependencies with conventional information criteria remains challenging in noisy situations, as the criteria tend to select overly complex PDEs. In this paper, we introduce an extension of the uncertainty-penalized Bayesian information criterion (UBIC), adapted to solve parametric PDE discovery problems efficiently without requiring computationally expensive PDE simulations. The extended UBIC uses quantified PDE uncertainty over different temporal or spatial points to prevent overfitting in model selection. It is computed on data transformed via power spectral densities, so that the discovered governing parametric PDE truly captures the qualitative features in frequency space with a few significant terms and their parametric dependencies (i.e., the varying PDE coefficients), evaluated with confidence intervals. Numerical experiments on canonical PDEs demonstrate that our extended UBIC identifies the true number of terms and their varying coefficients accurately, even in the presence of noise. The code is available at \url{https://github.com/Pongpisit-Thanasutives/parametric-discovery}.
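To make the model-selection idea concrete, here is a minimal sketch of uncertainty-penalized selection over a nested candidate library. The candidate terms, the bootstrap-based uncertainty proxy, and the log(n) penalty weighting are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def bic(rss, n, k):
    """Gaussian-likelihood BIC up to an additive constant."""
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(0)
n = 200
library = rng.normal(size=(n, 4))  # four candidate terms (illustrative)
# Ground truth uses only the first two terms, plus small noise.
y = 1.5 * library[:, 0] - 2.0 * library[:, 1] + 0.1 * rng.normal(size=n)

scores = []
for k in range(1, 5):
    X = library[:, :k]
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(res[0]) if res.size else float(np.sum((y - X @ beta) ** 2))
    # Uncertainty proxy: relative coefficient dispersion across bootstrap
    # refits; poorly determined coefficients inflate the penalty.
    boots = np.stack([
        np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
        for idx in (rng.integers(0, n, n) for _ in range(20))
    ])
    u = float(np.mean(np.std(boots, axis=0) / (np.abs(beta) + 1e-12)))
    # BIC plus an uncertainty penalty scaled by log(n) (schematic).
    scores.append(bic(rss, n, k) + np.log(n) * u)

best_k = int(np.argmin(scores)) + 1  # selected number of terms
print("selected complexity:", best_k)
```

With the well-determined two-term truth above, the penalized score bottoms out at the correct complexity even though larger models fit the noise slightly better.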


On uncertainty-penalized Bayesian information criterion

Thanasutives, Pongpisit, Fukui, Ken-ichi

arXiv.org Artificial Intelligence

Pongpisit Thanasutives, Graduate School of Information Science and Technology, Osaka University, Osaka, Japan (thanasutives@ai.sanken.osaka-u.ac.jp); Ken-ichi Fukui, SANKEN (The Institute of Scientific and Industrial Research), Osaka University, Osaka, Japan (fukui@ai.sanken.osaka-u.ac.jp)

The uncertainty-penalized Bayesian information criterion (UBIC) has been proposed as a new model-selection criterion for data-driven partial differential equation (PDE) discovery. In this paper, we show that using the UBIC is equivalent to applying the conventional BIC to a set of overparameterized models derived from the potential regression models of different complexity measures. The result indicates that the asymptotic property of the UBIC and the BIC holds equally.
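One schematic way to read the stated equivalence, assuming the uncertainty penalty enters additively at the same log-scale as the BIC's complexity term (the notation below is illustrative, not the paper's):

\[
\mathrm{BIC}(k) = -2\log \hat{L}_k + k\log n,
\qquad
\mathrm{UBIC}(k) = -2\log \hat{L}_k + (k + U_k)\log n,
\]

so the UBIC of a $k$-term model coincides with the conventional BIC of an overparameterized model with $k + U_k$ effective parameters. Under this reading, the BIC's asymptotic model-selection consistency carries over to the UBIC unchanged.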